Field | Details
---|---
Pub. Date | July, 2023
Product Name | The IUP Journal of Computer Sciences
Product Type | Article
Product Code | IJCS020723
Author Name | Sandeep Bhattacharjee
Availability | YES
Subject/Domain | Management
Download Format | PDF Format
No. of Pages | 11
Language models are a mechanism by which machines comprehend and anticipate human language as a component of contextually appropriate communication. The paper traces the development of language models, commonly referred to as Generative Pre-trained Transformers (GPTs). Using text mining analytics in the R 4.2.2 console, the study locates keywords in 72 GPT blogs that appeared on the OpenAI website. The most frequent terms found were "learning", "OpenAI", "models" and "model". Moreover, correlation analysis revealed associations between terms such as "appreciated", "creativity", "flex", "combine" and "connections". Academics, researchers and professionals working on business and information technology applications can all benefit greatly from this study.
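The abstract's workflow (keyword frequency plus term correlation over a corpus of blog posts) maps onto standard text-mining steps in R. The sketch below is illustrative only, assuming the 72 OpenAI blog posts have been saved as plain-text files in a hypothetical local folder `gpt_blogs/` and that the `tm` package is used; the frequency and correlation cutoffs are arbitrary examples, not the study's actual settings.

```r
# Minimal text-mining sketch in R (assumed setup: blog posts as .txt files in "gpt_blogs/")
library(tm)

# Build a corpus from the plain-text blog posts
corpus <- VCorpus(DirSource("gpt_blogs/", encoding = "UTF-8"))

# Standard cleaning steps before counting terms
corpus <- tm_map(corpus, content_transformer(tolower))
corpus <- tm_map(corpus, removePunctuation)
corpus <- tm_map(corpus, removeNumbers)
corpus <- tm_map(corpus, removeWords, stopwords("english"))
corpus <- tm_map(corpus, stripWhitespace)

# Term-document matrix: rows are terms, columns are the individual blog posts
tdm <- TermDocumentMatrix(corpus)

# Keywords occurring at least 50 times across the corpus (illustrative threshold)
findFreqTerms(tdm, lowfreq = 50)

# Terms whose document-level counts correlate with "learning" at r >= 0.5 (illustrative cutoff)
findAssocs(tdm, terms = "learning", corlimit = 0.5)
```

`findFreqTerms` surfaces high-frequency keywords such as those reported in the abstract, while `findAssocs` computes pairwise term correlations, the same kind of association analysis that linked terms like "creativity" and "connections".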
The for-profit OpenAI LP and its nonprofit parent, OpenAI Inc., make up the American Artificial Intelligence (AI) research laboratory known as OpenAI. The company conducts AI research with the declared intention of advancing and creating benign AI in a way that benefits all human beings. Elon Musk, Sam Altman and others founded the organization in San Francisco at the end of 2015, with a total commitment of $1 bn. In February 2018, Musk resigned from the board but continued to donate. Microsoft and Matthew Brown Companies invested $1 bn in OpenAI LP in 2019. The Pioneer Building in San Francisco's Mission District serves as OpenAI's headquarters.1
Keywords: Context, Text, Transformer, Trained, Predictive